# Multilingual Mathematical Reasoning
## RPT DeepSeek-R1-0528-Qwen3-8B
- Author: ykarout
- License: Apache-2.0
- A fine-tuned version of DeepSeek-R1-0528-Qwen3-8B, trained with TRL using the GRPO method and focused on improving mathematical reasoning ability.
- Large language model (Transformers); supports multiple languages
## MathOctopus Parallel 7B
- Author: Mathoctopus
- License: Apache-2.0
- MathOctopus is a multilingual mathematical-reasoning large language model based on the LLaMA 2 architecture; it supports 10 languages and specializes in solving mathematical problems.
- Large language model (Transformers); supports multiple languages